Multi-Cue Cascades for Robust Visual Tracking
Authors
Abstract
Similar Articles
Robust Visual Tracking via Spatio-Temporal Cue Integration
Appearance modeling is an important yet challenging issue in online visual tracking: errors accumulate as the model self-updates with newly obtained results, making it prone to drift. In this paper, we propose a novel online tracking algorithm using spatio-temporal cue integration. Specifically, the object is represented as a set of local patches with respect to th...
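As a rough illustration of the patch-based appearance idea (a minimal Python sketch of the general approach, not the cited paper's algorithm; the grid size, histogram-intersection similarity, and weight-decay factor are assumptions chosen for illustration):

```python
# Illustrative sketch: a target represented as a grid of local patches, each
# scored against a stored template; per-patch weights decay when a patch
# disagrees, a crude stand-in for integrating spatial and temporal cues.
# Pixel values are assumed to lie in [0, 1].
import numpy as np

def split_into_patches(region, grid=(3, 3)):
    """Split an image region (H, W) into grid patches, returned as a list."""
    h, w = region.shape[:2]
    ph, pw = h // grid[0], w // grid[1]
    return [region[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw]
            for i in range(grid[0]) for j in range(grid[1])]

def patch_similarity(a, b, bins=16):
    """Histogram intersection between two patches, in [0, 1]."""
    ha, _ = np.histogram(a, bins=bins, range=(0.0, 1.0), density=True)
    hb, _ = np.histogram(b, bins=bins, range=(0.0, 1.0), density=True)
    return float(np.minimum(ha, hb).sum() / max(ha.sum(), 1e-8))

def score_candidate(candidate, templates, weights):
    """Weighted sum of per-patch similarities for one candidate region."""
    sims = np.array([patch_similarity(p, t)
                     for p, t in zip(split_into_patches(candidate), templates)])
    return float(np.dot(weights, sims)), sims

def update_weights(weights, sims, decay=0.9):
    """Down-weight patches whose similarity drops (e.g. occluded patches)."""
    weights = decay * weights + (1.0 - decay) * sims
    return weights / weights.sum()
```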
Multi-cue based tracking
Visual tracking is a central topic in computer vision. However, accurate localization of the target object in extreme conditions (such as occlusion, scaling, illumination change, and shape transformation) still remains a challenge. In this paper, we explore utilizing multi-cue information to ensure robust tracking. Optical flow, color, and depth cues are simultaneously incorporated in our fra...
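A minimal sketch of the general multi-cue idea, assuming each cue (optical flow, color, depth) has already been converted into a per-pixel confidence map; the fusion weights and map shapes below are illustrative assumptions, not taken from the cited paper:

```python
# Illustrative sketch: fuse several cue confidence maps into one response map
# and take its peak as the estimated target location.
import numpy as np

def normalize(conf):
    """Scale a confidence map to [0, 1] so different cues are comparable."""
    conf = conf - conf.min()
    return conf / (conf.max() + 1e-8)

def fuse_cues(cue_maps, weights=None):
    """Weighted sum of normalized cue maps; equal weights by default."""
    maps = [normalize(m) for m in cue_maps]
    if weights is None:
        weights = np.ones(len(maps)) / len(maps)
    return sum(w * m for w, m in zip(weights, maps))

def locate_target(cue_maps, weights=None):
    """Return the (row, col) of the strongest fused response."""
    fused = fuse_cues(cue_maps, weights)
    return np.unravel_index(np.argmax(fused), fused.shape)

# Example with synthetic maps standing in for flow/color/depth confidences:
flow_conf = np.random.rand(64, 64)
color_conf = np.random.rand(64, 64)
depth_conf = np.random.rand(64, 64)
print(locate_target([flow_conf, color_conf, depth_conf]))
```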
Robust visual tracking using a fixed multi-camera system
The problem of tracking the position and orientation of a moving object using a stereo camera system is considered in this paper. A robust algorithm based on the extended Kalman filter is adopted, combined with an efficient selection technique for the object's image features, based on Binary Space Partitioning tree geometric models. An experimental study is carried out using a vision system of two...
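For reference, a standard textbook extended Kalman filter predict/update step (not the cited paper's implementation); the constant-velocity motion model and range/bearing measurement here are assumptions chosen for illustration:

```python
# Illustrative EKF cycle: linear constant-velocity prediction followed by a
# nonlinear range/bearing measurement update, the kind of geometry a
# calibrated camera rig could provide.
import numpy as np

def ekf_step(x, P, z, dt=1.0, q=1e-2, r=1e-2):
    """One predict/update cycle. x = [px, py, vx, vy], z = [range, bearing]."""
    # Predict with a linear constant-velocity model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    x = F @ x
    P = F @ P @ F.T + q * np.eye(4)

    # Nonlinear measurement: range and bearing of the target from the origin.
    px, py = x[0], x[1]
    rng = np.hypot(px, py)
    h = np.array([rng, np.arctan2(py, px)])
    # Jacobian of the measurement function with respect to the state.
    H = np.array([[px / rng, py / rng, 0, 0],
                  [-py / rng**2, px / rng**2, 0, 0]])
    R = r * np.eye(2)

    # Standard EKF update equations.
    y = z - h
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```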
Robust Visual Person Tracking for Interactive Displays
We present an approach to real-time person tracking in crowded and/or unknown environments using the integration of multiple visual modalities. We combine stereo, color, and face detection modules into a single robust system, and show an initial application in an interactive, face-responsive display. Dense, real-time stereo processing is used to isolate users from other objects and people in the bac...
Journal
Journal title: IEEE Access
Year: 2019
ISSN: 2169-3536
DOI: 10.1109/access.2019.2938187